
    Combining brain-computer interfaces and assistive technologies: state-of-the-art and challenges

    In recent years, new research has brought the field of EEG-based Brain-Computer Interfacing (BCI) out of its infancy and into a phase of relative maturity through many demonstrated prototypes such as brain-controlled wheelchairs, keyboards, and computer games. With this proof-of-concept phase in the past, the time is now ripe to focus on the development of practical BCI technologies that can be brought out of the lab and into real-world applications. In particular, we focus on the prospect of improving the lives of countless disabled individuals through a combination of BCI technology with existing assistive technologies (AT). In pursuit of more practical BCIs for use outside of the lab, in this paper we identify four application areas where disabled individuals could greatly benefit from advancements in BCI technology, namely “Communication and Control”, “Motor Substitution”, “Entertainment”, and “Motor Recovery”. We review the current state of the art and possible future developments, while discussing the main research issues in these four areas. In particular, we expect the most progress in the development of technologies such as hybrid BCI architectures, user-machine adaptation algorithms, the exploitation of users’ mental states for BCI reliability and confidence measures, the incorporation of principles from human-computer interaction (HCI) to improve BCI usability, and the development of novel BCI technology including better EEG devices.

    Etch rates for micromachining processing


    Automated Discrimination of Pathological Regions in Tissue Images: Unsupervised Clustering vs Supervised SVM Classification

    Recognizing and isolating cancerous cells from non-pathological tissue areas (e.g. connective stroma) is crucial for fast and objective immunohistochemical analysis of tissue images. This operation allows the further application of fully-automated techniques for quantitative evaluation of protein activity, since it avoids the need for a preliminary manual selection of the representative pathological areas in the image, as well as the need to take pictures only in the purely cancerous portions of the tissue. In this paper we present a fully-automated method based on unsupervised clustering that performs tissue segmentations highly comparable with those provided by a skilled operator, achieving on average an accuracy of 90%. Experimental results on a heterogeneous dataset of immunohistochemical lung cancer tissue images demonstrate that our proposed unsupervised approach exceeds the accuracy of a theoretically superior supervised method such as the Support Vector Machine (SVM) by 8%.
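
    The abstract does not give the authors' implementation, but the comparison it describes can be sketched in a few lines of Python with scikit-learn, assuming each tissue image has already been reduced to per-patch feature vectors (the toy data, feature dimensions, and the majority-vote cluster scoring below are illustrative assumptions, not the paper's pipeline):

        # Minimal sketch: unsupervised clustering vs. a supervised SVM for separating
        # cancerous patches from stroma, on synthetic stand-in feature vectors.
        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.svm import SVC
        from sklearn.metrics import accuracy_score

        def cluster_accuracy(y_true, y_clust):
            """Map each unlabeled cluster to its majority class before scoring."""
            mapped = np.empty_like(y_clust)
            for c in np.unique(y_clust):
                mask = y_clust == c
                mapped[mask] = np.bincount(y_true[mask]).argmax()
            return accuracy_score(y_true, mapped)

        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(0.0, 1.0, (200, 8)),   # stroma-like patches
                       rng.normal(2.0, 1.0, (200, 8))])  # cancer-like patches
        y = np.repeat([0, 1], 200)                       # 0 = stroma, 1 = cancerous

        # Unsupervised route: no labels are used during fitting
        km = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
        print("clustering accuracy:", cluster_accuracy(y, km))

        # Supervised route: SVM trained on half of the labeled patches
        idx = rng.permutation(len(X))
        train, test = idx[: len(X) // 2], idx[len(X) // 2 :]
        svm = SVC(kernel="rbf").fit(X[train], y[train])
        print("SVM accuracy:", accuracy_score(y[test], svm.predict(X[test])))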

    Testing the Nature of Kaluza-Klein Excitations at Future Lepton Colliders

    With one extra dimension, current high precision electroweak data constrain the masses of the first Kaluza-Klein excitations of the Standard Model gauge fields to lie above \simeq 4 TeV. States with masses not much larger than this should be observable at the LHC. However, even for first excitation masses close to this lower bound, the second set of excitations will be too heavy to be produced, thus eliminating the possibility of realizing the cleanest signature for KK scenarios. Previous studies of heavy Z' and W' production in this mass range at the LHC have demonstrated that very little information can be obtained about their couplings to the conventional fermions given the limited available statistics, and imply that the LHC cannot distinguish an ordinary Z' from the degenerate pair of the first KK excitations of the \gamma and Z. In this paper we discuss the capability of lepton colliders with center of mass energies significantly below the excitation mass to resolve this ambiguity. In addition, we examine how direct measurements obtained on and near the top of the first excitation peak at lepton colliders can confirm these results. For more than one extra dimension we demonstrate that it is likely that the first KK excitation is too massive to be produced at the LHC. Comment: 38 pages, 10 Figs, LaTeX, comments added
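
    For orientation, the 4 TeV bound quoted above refers to the first level of the usual Kaluza-Klein tower; for a single flat extra dimension of compactification radius R (the standard setup in such analyses), the gauge-boson excitations satisfy, schematically,

        M_n = \sqrt{M_0^2 + n^2/R^2} \simeq n/R  (n = 1, 2, ...),

    so a first excitation at M_1 \simeq 4 TeV places the second level near 2 M_1 \simeq 8 TeV, which is why the second set of excitations lies beyond the LHC's reach.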

    Phenomenology of flavor-mediated supersymmetry breaking

    The phenomenology of a new economical SUSY model that utilizes dynamical SUSY breaking and gauge-mediation (GM) for the generation of the sparticle spectrum and the hierarchy of fermion masses is discussed. Similarities between the communication of SUSY breaking through a messenger sector and the generation of flavor using the Froggatt-Nielsen (FN) mechanism are exploited, leading to the identification of vector-like messenger fields with FN fields, and the messenger U(1) as a flavor symmetry. An immediate consequence is that the first and second generation scalars acquire flavor-dependent masses, but do not violate FCNC bounds since their mass scale, consistent with effective SUSY, is of order 10 TeV. We define and advocate a minimal flavor-mediated model (MFMM), recently introduced in the literature, that successfully accommodates the small flavor-breaking parameters of the standard model using order one couplings and ratios of flavon field vevs. The mediation of SUSY breaking occurs via two-loop log-enhanced GM contributions, as well as several one-loop and two-loop Yukawa-mediated contributions for which we provide analytical expressions. The MFMM is parameterized by a small set of masses and couplings, with values restricted by several model constraints and experimental data. The next-to-lightest sparticle (NLSP) always has a decay length that is larger than the scale of a detector, and is either the lightest stau or the lightest neutralino. Similar to ordinary GM models, the best collider search strategies are, respectively, inclusive production of at least one highly ionizing track, or events with many taus plus missing energy. In addition, D^0 - \bar{D}^0 mixing is also a generic low energy signal. Finally, the dynamical generation of the neutrino masses is briefly discussed. Comment: 54 pages, LaTeX, 8 figures
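
    Schematically, the Froggatt-Nielsen suppression the abstract refers to generates small Yukawa couplings from order-one coefficients and powers of a flavon-vev ratio (the charge notation below is generic, not the specific assignments of the MFMM):

        y_{ij} \sim c_{ij} \epsilon^{q_i + q_j},   \epsilon = <\phi> / M << 1,   c_{ij} = O(1),

    where q_i are the U(1) flavor charges of the matter fields, <\phi> is the flavon vev, and M is the messenger (flavor) scale.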

    Anomalous Heat Conduction and Anomalous Diffusion in Low Dimensional Nanoscale Systems

    Thermal transport is an important energy transfer process in nature. Phonons are the major energy carriers for heat in semiconductor and dielectric materials. In analogy to Ohm's law for electrical conduction, Fourier's law is a fundamental rule of heat transfer in solids. It states that the thermal conductivity is independent of sample size and geometry. Although Fourier's law has been very successful in describing macroscopic thermal transport over the past two hundred years, its validity in low dimensional systems is still an open question. Here we give a brief review of recent developments in experimental, theoretical and numerical studies of heat transport in low dimensional systems, including lattice models, nanowires, nanotubes and graphene. We will demonstrate that phonons transport super-diffusively in low dimensional systems, which leads to a size dependent thermal conductivity. In other words, Fourier's law breaks down in low dimensional structures.
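
    In equation form, the statement that Fourier's law breaks down amounts to a length-dependent conductivity (the exponent below is generic; its value depends on the system and is not quoted in the abstract):

        J = -\kappa \nabla T,   with   \kappa(L) \propto L^{\alpha},   0 < \alpha < 1,

    in contrast to normal (diffusive) transport, for which \kappa is independent of the sample length L. The super-diffusive phonon transport mentioned above corresponds to energy spreading faster than linearly in time, <\Delta x^2(t)> \propto t^{\beta} with \beta > 1.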

    Time-integrated luminosity recorded by the BABAR detector at the PEP-II e+e- collider

    We describe a measurement of the time-integrated luminosity of the data collected by the BABAR experiment at the PEP-II asymmetric-energy e+e- collider at the ϒ(4S), ϒ(3S), and ϒ(2S) resonances and in a continuum region below each resonance. We measure the time-integrated luminosity by counting e+e-→e+e- and (for the ϒ(4S) only) e+e-→μ+μ- candidate events, allowing additional photons in the final state. We use data-corrected simulation to determine the cross-sections and reconstruction efficiencies for these processes, as well as the major backgrounds. Due to the large cross-sections of e+e-→e+e- and e+e-→μ+μ-, the statistical uncertainties of the measurement are substantially smaller than the systematic uncertainties. The dominant systematic uncertainties are due to observed differences between data and simulation, as well as uncertainties on the cross-sections. For data collected on the ϒ(3S) and ϒ(2S) resonances, an additional uncertainty arises due to ϒ→e+e-X background. For data collected off the ϒ resonances, we estimate an additional uncertainty due to time-dependent efficiency variations, which can affect the short off-resonance runs. The relative uncertainties on the luminosities of the on-resonance (off-resonance) samples are 0.43% (0.43%) for the ϒ(4S), 0.58% (0.72%) for the ϒ(3S), and 0.68% (0.88%) for the ϒ(2S). This work is supported by the US Department of Energy and National Science Foundation, the Natural Sciences and Engineering Research Council (Canada), the Commissariat à l’Energie Atomique and Institut National de Physique Nucléaire et de Physique des Particules (France), the Bundesministerium für Bildung und Forschung and Deutsche Forschungsgemeinschaft (Germany), the Istituto Nazionale di Fisica Nucleare (Italy), the Foundation for Fundamental Research on Matter (The Netherlands), the Research Council of Norway, the Ministry of Education and Science of the Russian Federation, Ministerio de Ciencia e Innovación (Spain), and the Science and Technology Facilities Council (United Kingdom). Individuals have received support from the Marie-Curie IEF program (European Union) and the A.P. Sloan Foundation (USA).
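
    Schematically, the counting measurement described above reduces to dividing background-subtracted candidate yields by the efficiency-weighted visible cross-section (the notation here is generic rather than BABAR's own):

        L_int = (N_cand - N_bkg) / (\varepsilon \sigma_vis),

    where N_cand is the number of selected e+e-→e+e- (or e+e-→μ+μ-) candidates, N_bkg the estimated background, and \varepsilon and \sigma_vis the reconstruction efficiency and visible cross-section taken from the data-corrected simulation.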

    Measurement of the B0-anti-B0-Oscillation Frequency with Inclusive Dilepton Events

    The B^0-\bar{B}^0 oscillation frequency has been measured with a sample of 23 million B\bar{B} pairs collected with the BABAR detector at the PEP-II asymmetric B Factory at SLAC. In this sample, we select events in which both B mesons decay semileptonically and use the charge of the leptons to identify the flavor of each B meson. A simultaneous fit to the decay time difference distributions for opposite- and same-sign dilepton events gives \Delta m_d = 0.493 \pm 0.012 (stat) \pm 0.009 (syst) ps^{-1}. Comment: 7 pages, 1 figure, submitted to Physical Review Letters
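
    Neglecting mistag and resolution effects (which the actual fit accounts for), the idealized decay-time-difference distributions behind this measurement are

        P_{\pm}(\Delta t) \propto (e^{-|\Delta t|/\tau_{B^0}} / 4\tau_{B^0}) [1 \pm \cos(\Delta m_d \Delta t)],

    with the + sign for opposite-sign (unmixed) and the - sign for same-sign (mixed) dilepton events; \Delta m_d is the oscillation frequency extracted by the simultaneous fit.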

    Shrinking a large dataset to identify variables associated with increased risk of Plasmodium falciparum infection in Western Kenya

    Large datasets are often not amenable to analysis using traditional single-step approaches. Here, our general objective was to apply imputation techniques, principal component analysis (PCA), elastic net and generalized linear models to a large dataset in a systematic approach to extract the most meaningful predictors for a health outcome. To demonstrate these techniques, we extracted predictors of Plasmodium falciparum infection from a large covariate dataset with a limited number of observations, using data from the People, Animals, and their Zoonoses (PAZ) project: the data, collected from 415 homesteads in western Kenya, contain over 1500 variables describing the health, environment, and social factors of the humans, livestock, and the homesteads in which they reside. The wide, sparse dataset was simplified to 42 predictors of P. falciparum malaria infection, and wealth rankings were produced for all homesteads. The 42 predictors make biological sense and are supported by previous studies. The systematic data-mining approach we used would make many large datasets more manageable and informative for decision-making processes and health policy prioritization.
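
    A pipeline of this kind can be sketched in Python with scikit-learn; the code below is illustrative only (toy data, generic parameter choices, and the use of the first principal component of asset-type variables as a wealth index are assumptions, not the paper's exact implementation):

        # Minimal sketch of the multi-step approach: impute missing values, build a
        # PCA-based wealth index, screen ~1500 variables with an elastic net, then
        # fit an ordinary logistic GLM on the retained predictors.
        import numpy as np
        from sklearn.impute import SimpleImputer
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        n, p = 415, 1500                          # homesteads x candidate variables (toy data)
        X = rng.normal(size=(n, p))
        X[rng.random((n, p)) < 0.10] = np.nan     # sprinkle in missingness
        y = rng.integers(0, 2, size=n)            # 1 = P. falciparum positive (toy labels)

        X_imp = SimpleImputer(strategy="median").fit_transform(X)
        X_std = StandardScaler().fit_transform(X_imp)

        # Wealth ranking: first principal component of the asset-type columns
        # (here the first 20 columns stand in for asset variables).
        wealth_index = PCA(n_components=1).fit_transform(X_std[:, :20]).ravel()

        # Elastic net (L1/L2-penalized logistic regression) shrinks the variable set.
        enet = LogisticRegression(penalty="elasticnet", solver="saga",
                                  l1_ratio=0.5, C=0.1, max_iter=5000).fit(X_std, y)
        keep = np.flatnonzero(enet.coef_.ravel() != 0)
        print(f"retained {keep.size} candidate predictors")

        # Final, effectively unpenalized GLM on the retained predictors.
        if keep.size:
            glm = LogisticRegression(C=1e6, max_iter=5000).fit(X_std[:, keep], y)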

    Measurement of the Bottom-Strange Meson Mixing Phase in the Full CDF Data Set

    We report a measurement of the bottom-strange meson mixing phase \beta_s using the time evolution of B0_s -> J/\psi (-> \mu+\mu-) \phi (-> K+ K-) decays in which the quark-flavor content of the bottom-strange meson is identified at production. This measurement uses the full data set of proton-antiproton collisions at sqrt(s) = 1.96 TeV collected by the Collider Detector experiment at the Fermilab Tevatron, corresponding to 9.6 fb-1 of integrated luminosity. We report confidence regions in the two-dimensional space of \beta_s and the B0_s decay-width difference \Delta\Gamma_s, and measure \beta_s in [-\pi/2, -1.51] U [-0.06, 0.30] U [1.26, \pi/2] at the 68% confidence level, in agreement with the standard model expectation. Assuming the standard model value of \beta_s, we also determine \Delta\Gamma_s = 0.068 +- 0.026 (stat) +- 0.009 (syst) ps-1 and the mean B0_s lifetime, \tau_s = 1.528 +- 0.019 (stat) +- 0.009 (syst) ps, which are consistent and competitive with determinations by other experiments. Comment: 8 pages, 2 figures, Phys. Rev. Lett. 109, 171802 (2012)
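
    For orientation, the reported \Delta\Gamma_s and mean lifetime relate to the widths of the light and heavy B0_s mass eigenstates through the standard definitions

        \Delta\Gamma_s = \Gamma_L - \Gamma_H,   \Gamma_s = (\Gamma_L + \Gamma_H)/2,   \tau_s = 1/\Gamma_s,

    so the quoted \tau_s = 1.528 ps corresponds to a mean width \Gamma_s \simeq 0.654 ps-1.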